multi-class classification
Multi-Class Learning: From Theory to Algorithm
Jian Li, Yong Liu, Rong Yin, Hua Zhang, Lizhong Ding, Weiping Wang
Moreover, the proposed multi-class kernel learning algorithms have statistical guarantees and fast convergence rates. Experimental results on a large number of benchmark datasets show that our proposed methods significantly outperform existing multi-class classification methods. The major contributions of this paper include: 1) A new local Rademacher complexity based bound with a fast convergence rate for multi-class classification is established. Existing Rademacher-complexity analyses for multi-class classifiers [16, 27] do not take into account couplings among different classes.
Supplementary Material: Can Less Be More? When Increasing-to-Balancing Label Noise Rates Considered Beneficial
A.10 Extension to multi-class As explained at the beginning, our algorithm largely extends to the multi-class/group setting. The task is to predict whether an individual's income exceeds 50K. The dataset consists of 48,842 examples and 28 features. FairFace is a face attribute dataset containing 108,501 images with balanced race and gender groups [15]. We use a pre-trained vision transformer (ViT-B/32) model [8] to extract image representations and project them into 50-dimensional feature vectors. For constrained learning, we categorize race into White and Non-White groups.
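The projection step described above (high-dimensional image embeddings reduced to 50-dimensional feature vectors) is commonly done with PCA. A minimal sketch, assuming PCA is the projection method and using random 512-dimensional vectors as placeholders for the ViT-B/32 embeddings (neither assumption is confirmed by the text):

```python
import numpy as np

# Placeholder features: the supplementary material extracts embeddings
# with a pre-trained ViT-B/32; random vectors stand in for them here.
rng = np.random.default_rng(0)
features = rng.normal(size=(200, 512))  # (n_images, embed_dim)

def project_pca(X, k=50):
    """Project rows of X onto their top-k principal components."""
    Xc = X - X.mean(axis=0)
    # SVD of the centered matrix; rows of Vt are principal directions.
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T

Z = project_pca(features, k=50)
print(Z.shape)  # (200, 50)
```

With real data, `features` would be replaced by the ViT embeddings of the 108,501 FairFace images, and the 50-dimensional `Z` would feed the constrained-learning step.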
Demystifying the Optimal Performance of Multi-Class Classification
Classification is a fundamental task in science and engineering on which machine learning methods have shown outstanding performance. However, it is challenging to determine whether such methods have achieved the Bayes error rate, that is, the lowest error rate attainable by any classifier. This is mainly because the Bayes error rate is not known in general, so effectively estimating it is paramount. Inspired by the work of Ishida et al. (2023), we propose an estimator for the Bayes error rate of supervised multi-class classification problems. We analyze several theoretical aspects of this estimator, including its consistency, unbiasedness, convergence rate, variance, and robustness. We also propose a denoising method that reduces the noise that potentially corrupts the data labels, and we improve the robustness of the proposed estimator to outliers by incorporating the median-of-means estimator. Our analysis demonstrates the consistency, asymptotic unbiasedness, convergence rate, and robustness of the proposed estimators.
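The median-of-means idea mentioned in the abstract is simple to state: split the samples into blocks, average within each block, and report the median of the block means, which resists a minority of corrupted blocks. A minimal generic sketch (not the paper's estimator; block count and data are illustrative):

```python
import numpy as np

def median_of_means(samples, n_blocks=25, seed=None):
    """Median-of-means estimator of the mean.

    Randomly partitions the samples into n_blocks blocks, averages each
    block, and returns the median of the block means. As long as fewer
    than half the blocks contain outliers, the estimate stays close to
    the true mean.
    """
    rng = np.random.default_rng(seed)
    samples = np.asarray(samples, dtype=float)
    perm = rng.permutation(len(samples))
    blocks = np.array_split(samples[perm], n_blocks)
    return float(np.median([b.mean() for b in blocks]))

# Example: 990 clean draws near 0.3 plus 10 gross outliers.
rng = np.random.default_rng(0)
data = np.concatenate([rng.normal(0.3, 0.05, 990), np.full(10, 100.0)])
print(data.mean())                                 # badly skewed by outliers
print(median_of_means(data, n_blocks=25, seed=0))  # stays near 0.3
```

With 25 blocks and only 10 outliers, at most 10 blocks can be corrupted, so the median block mean is always a clean one; this is the robustness property the abstract invokes.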
A Universal Growth Rate for Learning with Smooth Surrogate Losses
This paper presents a comprehensive analysis of the growth rate of $H$-consistency bounds (and excess error bounds) for various surrogate losses used in classification. We prove a square-root growth rate near zero for smooth margin-based surrogate losses in binary classification, providing both upper and lower bounds under mild assumptions.
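The square-root growth claim above can be written schematically as follows; the notation here is an assumed generic form of an $H$-consistency bound, not taken from the paper:

```latex
% Generic shape of an H-consistency bound: the excess zero-one error of
% h \in H is controlled by a transformation \Gamma of its excess
% surrogate loss,
%   R_{0\text{-}1}(h) - R^{*}_{0\text{-}1,H}
%       \;\le\; \Gamma\bigl( R_{\ell}(h) - R^{*}_{\ell,H} \bigr),
% and the paper's result says that for smooth margin-based surrogates
% \ell in binary classification, \Gamma grows like a square root near
% zero:
%   \Gamma(t) = \Theta\!\bigl(\sqrt{t}\bigr) \quad \text{as } t \to 0^{+},
% with matching upper and lower bounds under mild assumptions.
```

The practical reading: halving the surrogate excess loss only shrinks the guaranteed zero-one excess error by a factor of roughly $\sqrt{2}$ once both are small.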